Patent abstract:

Publication number: SE1350065A1
Application number: SE1350065
Filing date: 2013-01-22
Publication date: 2014-07-23
Inventor: Joachim Samuelsson
Applicant: Crunchfish Ab
IPC main classification:

Patent description:

Wherein said controller is configured to detect and track an object via a video stream provided by a camera, and indicate an operating area on the display which is currently open for manipulation by the tracked object by changing displaying properties of a marker area on the display.
In one embodiment the controller is further configured to indicate the operating area by only changing the display properties of the marker area.
Such a computing device enables improved visual feedback to a user in that the displayed content or the display is not cluttered, obscured or hidden.
In one embodiment the displaying properties are the color, contrast and/or brightness of the marker area.
In one embodiment the marker area has an extension and the controller is further configured to detect that the tracked object is moved in a direction substantially perpendicular to the plane of the display and in response thereto adapt the marker area, by further increasing the displaying properties of the marker area and/or the extension of the marker area.
In one embodiment, the computing device is a mobile communications terminal. In one embodiment, the computing device is a tablet computer or a laptop computer. In one embodiment, the computing device is a game console. In one embodiment, the computing device is a media device such as a television set or media system.
It is also an object of the teachings of this application to overcome the problems listed above by providing a method for use in a computing device comprising a display, said method comprising detecting and tracking an object via a video stream provided by a camera, and indicating an operating area on the display which is currently open for manipulation by the tracked object by changing displaying properties of a marker area on the display.
It is a further object of the teachings of this application to overcome the problems listed above by providing a computer readable medium comprising instructions that, when loaded into and executed by a controller, such as a processor, cause the execution of a method according to herein.

The inventors of the present invention have realized, after inventive and insightful reasoning, that by (only) changing the display properties of a marker area there is no need to display a cursor or other visual object indicating a current operating area which may obstruct, hide or clutter displayed content on a display. The display properties are changed in a manner to increase their visibility, not necessarily the discernibility of objects within the marker area, so that the position can be easily discerned and spotted by a user and the user is made aware of where the operating area currently is. In one embodiment the displaying properties of the marker area are changed so that the original display content of the marker area is modified or distorted to further increase the marker area's discernibility.
Furthermore, in a touchless user interface a user continuously moves his hand inside and outside the camera view, much as a user moves his hand to the keypad and away from the keypad. A marker that is to indicate the position of a tracked object will then be jumping around on the display, which will be confusing to a user. A user will perceive a soft, but discernible, change in the displaying properties, such as changed contrast, brightness or color, as less confusing in that it provides a softer change of the displayed content in contrast to the abrupt appearance of a new object, the marker.
The teachings herein find use in control systems for devices having user interfaces such as mobile phones, smart phones, tablet computers, computers (portable and stationary), gaming consoles and media and other infotainment devices.
Other features and advantages of the disclosed embodiments will appear from the following detailed disclosure, from the attached dependent claims as well as from the drawings. Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein.
All references to "a/an/the [element, device, component, means, step, etc.]" are to be interpreted openly as referring to at least one instance of the element, device, component, means, step, etc., unless explicitly stated otherwise. The steps of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated.
BRIEF DESCRIPTION OF DRAWINGS

Figures 1A, 1B and 1C are schematic views each of a computing device according to the teachings herein; Figure 2 is a schematic view of the components of a computing device according to the teachings herein; Figure 3 is a schematic view of a computer-readable memory according to the teachings herein; Figures 4A and 4B show an example embodiment according to the teachings herein; and Figure 5 shows a flowchart illustrating a general method according to an embodiment of the teachings herein.
DETAILED DESCRIPTION

The disclosed embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
Like numbers refer to like elements throughout.
Figure 1 generally shows a computing device 100 according to an embodiment herein. In one embodiment the computing device 100 is configured for network communication, either wireless or wired. Examples of a computing device 100 are: a personal computer, desktop or laptop, a tablet computer, a mobile communications terminal such as a mobile telephone, a smart phone, a personal digital assistant and a game console. Three embodiments will be exemplified and described as being a smartphone in figure 1A, a laptop computer 100 in figure 1B as an example of a computer and a TV 100 in figure 1C as an example of a media device. A media device is considered to be a computing device in the context of this application in the aspect that it is configured to receive digital content, process or compute the content and present the resulting or computed media, such as image(s) and/or audio.

Referring to figure 1A, a mobile communications terminal in the form of a smartphone 100 comprises a housing 110 in which a display 120 is arranged. In one embodiment the display 120 is a touch display. In other embodiments the display 120 is a non-touch display. Furthermore, the smartphone 100 comprises two keys 130a, 130b.
In this embodiment there are two keys 130, but any number of keys is possible and depends on the design of the smartphone 100. In one embodiment the smartphone 100 is configured to display and operate a virtual key 135 on the touch display 120. It should be noted that the number of virtual keys 135 is dependent on the design of the smartphone 100 and an application that is executed on the smartphone 100. The smartphone 100 is also equipped with a camera 160. The camera 160 is a digital camera that is arranged to take video or still photographs by recording images on an electronic image sensor (not shown). In one embodiment the camera 160 is an external camera. In one embodiment the camera is alternatively replaced by a source providing an image stream.
Referring to figure 1B, a laptop computer 100 comprises a display 120 and a housing 110. The housing comprises a controller or CPU (not shown) and one or more computer-readable storage mediums (not shown), such as storage units and internal memory. Examples of storage units are disk drives or hard drives. The laptop computer 100 further comprises at least one data port. Data ports can be wired and/or wireless.
Examples of data ports are USB (Universal Serial Bus) ports, Ethernet ports or WiFi (according to IEEE standard 802.11) ports. Data ports are configured to enable a laptop computer 100 to connect with other computing devices or a server.
The laptop computer 100 further comprises at least one input unit such as a keyboard 130. Other examples of input units are computer mouse, touch pads, touch screens or joysticks to name a few.
The laptop computer 100 is further equipped with a camera 160. The camera 160 is a digital camera that is arranged to take video or still photographs by recording images on an electronic image sensor (not shown). In one embodiment the camera 160 is an external camera. In one embodiment the camera is alternatively replaced by a source providing an image stream.

Referring to figure 1C, a media device, such as a television set, TV, 100 comprises a display 120 and a housing 110. The housing comprises a controller or CPU (not shown) and one or more computer-readable storage mediums (not shown), such as storage units and internal memory, for storing user settings and control software. The computing device 100 may further comprise at least one data port (not shown). Data ports can be wired and/or wireless. Examples of data ports are USB (Universal Serial Bus) ports, Ethernet ports or WiFi (according to IEEE standard 802.11) ports. Such data ports are configured to enable the TV 100 to connect with an external storage medium, such as a USB stick, or to connect with other computing devices or a server.
The TV 100 may further comprise an input unit such as at least one key 130 or a remote control 130b for operating the TV 100.
The TV 100 is further equipped with a camera 160. The camera 160 is a digital camera that is arranged to take video or still photographs by recording images on an electronic image sensor (not shown). In one embodiment the camera 160 is an external camera. In one embodiment the camera is alternatively replaced by a source providing an image stream.
Figure 2 shows a schematic view of the general structure of a device according to figure 1. The device 100 comprises a controller 210 which is responsible for the overall operation of the computing device 200 and is preferably implemented by any commercially available CPU ("Central Processing Unit"), DSP ("Digital Signal Processor") or any other electronic programmable logic device. The controller 210 is configured to read instructions from the memory 240 and execute these instructions to control the operation of the computing device 100. The memory 240 may be implemented using any commonly known technology for computer-readable memories such as ROM, RAM, SRAM, DRAM, CMOS, FLASH, DDR, SDRAM or some other memory technology. The memory 240 is used for various purposes by the controller 210, one of them being for storing application data and program instructions 250 for various software modules in the computing device 200. The software modules include a real-time operating system, drivers for a user interface 220, an application handler as well as various applications 250.

The computing device 200 further comprises a user interface 220, which in the computing device of figures 1A, 1B and 1C is comprised of the display 120 and the keys 130, 135.
The computing device 200 may further comprise a radio frequency interface 230, which is adapted to allow the computing device to communicate with other devices through a radio frequency band through the use of different radio frequency technologies. Examples of such technologies are IEEE 802.11, IEEE 802.15, ZigBee, WirelessHART, WiFi, Bluetooth®, W-CDMA/HSPA, GSM, UTRAN and LTE to name a few.
The computing device 200 is further equipped with a camera 260. The camera 260 is a digital camera that is arranged to take video or still photographs by recording images on an electronic image sensor (not shown).
The camera 260 is operably connected to the controller 210 to provide the controller with a video stream 265, i.e. the series of images captured, for further processing possibly for use in and / or according to one or several of the applications 250.
In one embodiment the camera 260 is an external camera or source of an image stream.
References to 'computer-readable storage medium', 'computer program product', 'tangibly embodied computer program' etc. or a 'controller', 'computer', 'processor' etc. should be understood to encompass not only computers having different architectures such as single/multi-processor architectures and sequential (Von Neumann)/parallel architectures but also specialized circuits such as field-programmable gate arrays (FPGA), application specific circuits (ASIC), signal processing devices and other devices. References to computer program, instructions, code etc. should be understood to encompass software for a programmable processor or firmware such as, for example, the programmable content of a hardware device whether instructions for a processor, or configuration settings for a fixed-function device, gate array or programmable logic device etc.
Figure 3 shows a schematic view of a computer-readable medium as described in the above. The computer-readable medium 30 is in this embodiment a data disc 30. In one embodiment the data disc 30 is a magnetic data storage disc. The data disc 30 is configured to carry instructions 31 that, when loaded into a controller, such as a processor, execute a method or procedure according to the embodiments disclosed above. The data disc 30 is arranged to be connected to or within and read by a reading device 32, for loading the instructions into the controller. One such example of a reading device 32 in combination with one (or several) data disc(s) 30 is a hard drive. It should be noted that the computer-readable medium can also be other mediums such as compact discs, digital video discs, flash memories or other memory technologies commonly used.
The instructions 31 may also be downloaded to a computer data reading device 34, such as a laptop computer or other device capable of reading computer coded data on a computer-readable medium, by comprising the instructions 31 in a computer-readable signal 33 which is transmitted via a wireless (or wired) interface (for example via the Internet) to the computer data reading device 34 for loading the instructions 31 into a controller. In such an embodiment the computer-readable signal 33 is one type of a computer-readable medium 30.
The instructions may be stored in a memory (not shown explicitly in figure 3, but referenced 240 in figure 2) of the laptop computer 34.
An improved manner of providing visual feedback when tracking an object will be disclosed below with reference to the accompanying figures. The examples will be illustrated focusing on resulting visual feedback, but it should be clear that the processing is performed in part or fully in a computing device comprising a controller as disclosed above with reference to figures 1 and 2 or caused to be performed by executing instructions stored on a computer-readable medium as disclosed with reference to figure 3.
Figure 4A shows an example computing device such as in figure 1, in this example a laptop computer 100 such as the laptop computer 100 of figure 1B, configured to detect and track an object, in this example a hand H, via the camera 160.
How such an object H is detected and tracked is disclosed in the Swedish patent application SE 1250910-5 and will not be discussed in further detail in the present application. For further details on this, please see the mentioned Swedish patent application. It should be noted, however, that the teachings of the present application may be implemented through the use of other tracking manners than disclosed in Swedish patent application SE 1250910-5.
The laptop computer is displaying a number of objects 135 arranged to be manipulated on the display 120. To enable a user to understand how his actions and movements relating to the tracked hand H manipulate the displayed objects 135, the laptop computer 100 is configured to indicate a current position on the display which is currently open for manipulation by the hand H - an operating area - by changing the displaying properties of a marker area 170 on the display 120. The displaying properties that may be changed are the color, contrast and/or brightness.
This allows a user to clearly see where on the display 120 he is currently operating without the need for a cursor or other displayed object which may clutter the display 120 and hide, obscure or conceal underlying content. This is a problem especially in devices with relatively small screens, such as smart phones and tablet computers.
The laptop computer 100 is thus configured to indicate an operating area by only changing the displaying properties in a marker area 170.
The marker area 170 has an extension d1. The exact measurement of the extension depends on the user interface design and other parameters. In the example of figure 4A the extension is circular (shown elliptical due to the viewing angle). Typically the extension of the marker area 170 is initially 1 to 5 pixels in diameter, depending on display size and/or display resolution. The extension d1 of the marker area 170 is small to avoid distorting the displayed content to a disturbing degree.
In one embodiment the extension of the marker area 170 equals the area which a user may manipulate and any manipulation effected by a user results in a manipulation of any and all objects within the marker area 170.

In one embodiment the center of the marker area 170 indicates the area which a user may manipulate and any manipulation effected by a user results in a manipulation of an object at or adjacent to the center of the marker area 170.
To enable the user to more clearly see which area he is currently operating within, the laptop computer 100 is configured to detect that the tracked object, the hand H, is moved in a direction substantially perpendicular to the plane of the display 120, that is towards the display 120. Details on how such Z-axis detection may be implemented are disclosed in the Swedish patent application SE 1250910-5 and will not be discussed in further detail in the present application. For further details on this, please see the mentioned Swedish patent application. It should be noted, however, that the teachings of the present application may be implemented through the use of other tracking manners than disclosed in Swedish patent application SE 1250910-5.
As the laptop computer 100 detects a movement towards the display 120 of the tracked object H, the laptop computer 100 is configured to adapt the marker area 170.
The marker area 170 may be adapted by further changing the displaying properties by further increasing the contrast and/or brightness of the marker area 170 or by further changing the color of the marker area 170.
The marker area 170 may be adapted by further changing the displaying properties by further increasing the extension of the marker area 170. As is shown in figure 4B, the hand H has moved from a distance D1 to a distance D2 from the display 120 and the marker area 170 has been adapted to an increased extension d2.
The extension d1, d2 and/or displaying properties of the marker area 170 may be dependent on the distance D1, D2 of the tracked object H to the display 120. The dependency may be linear or stepwise. In an embodiment where the extension d1, d2 and/or displaying properties are stepwise dependent on the distance D1, D2, the laptop computer 100 is configured to adapt the extension d1, d2 and/or displaying properties (incrementally) as the distance D1, D2 changes below or above at least a first threshold distance.
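The linear and stepwise dependencies described above can be sketched as follows. This is an illustrative sketch, not from the patent: all function names, the pixel sizes and the centimetre thresholds are assumptions chosen for the example.

```python
def linear_extension(distance_cm, d_min=1.0, d_max=15.0,
                     near_cm=5.0, far_cm=50.0):
    """Marker extension (pixels) grows linearly as the hand approaches."""
    # Clamp the distance into the working range, then interpolate.
    distance_cm = max(near_cm, min(far_cm, distance_cm))
    t = (far_cm - distance_cm) / (far_cm - near_cm)  # 0 = far, 1 = near
    return d_min + t * (d_max - d_min)

def stepwise_extension(distance_cm, thresholds=(40.0, 25.0, 10.0),
                       sizes=(1.0, 4.0, 8.0, 15.0)):
    """Marker extension changes incrementally as threshold distances are crossed."""
    step = sum(1 for th in thresholds if distance_cm < th)
    return sizes[step]
```

The same mappings could equally drive the contrast or brightness change instead of the extension; only the output range would differ.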
In one embodiment the laptop computer 100 is configured to adapt the marker area increasingly as the distance D1, D2 is reduced (D1 to D2). This allows for a user to more clearly focus on the area to be controlled and more clearly determine what action may be taken.
In one embodiment the laptop computer 100 is configured to adapt the marker area 170 increasingly as the distance D1, D2 is increased (D2 to D1). This allows for a user to more clearly see the marker area 170 when at a distance, should the tracked movement be a result of the user moving. In such an embodiment the laptop computer 100 is further configured to detect a user, possibly by detecting a face (not shown), in the vicinity of the tracked object H and determine if a distance to the face changes in the same manner as the distance D1, D2 to the tracked object H, which would be indicative of a user simply moving.
The marker area 170 may be adapted by further changing the display properties by further increasing the contrast and/or brightness in combination with increasing the extension.
It should be noted that the distance D1, D2 should be understood to not be limited to the distance between the tracked object H and the display 120, but may also be a distance between the tracked object H and the camera 160.
In one embodiment the absolute value of the distance D1, D2 is not decisive for the extension d1, d2 or the changed displaying properties of the marker area 170. In such an embodiment it is the change in distance D1-D2 that is decisive.
In one embodiment the laptop computer 100 is configured to detect a tracked object H and in response thereto indicate a marker area 170 at an initial position and/or an initial extension. The initial position may be the middle of the display 120. The initial position may alternatively be in a corner of the display 120. This allows a user to always start in the same position, which enables the user to find the marker area 170 in a simple manner. The initial extension may be based on a detected distance or it may be a fixed initial extension, such as discussed in the above with relation to the first extension d1.
In one embodiment the laptop computer 100 is configured to detect a speed V (indicated in figure 4A with a speed vector V) of the tracked object H and determine whether the detected speed V is above a speed threshold and, if so, determine that the tracked movement is an event relating to an object 135 to be manipulated, such as a select event or activate event, and, if the detected speed V is below the speed threshold, determine that the marker area 170 should be adapted.
When estimating the velocity in a Z direction the controller may use the change in Z direction as disclosed in the Swedish patent application SE 1250910-5. The change in Z direction is measured by estimating the change in the X and Y positions of the keypoints between two image frames, that is delta x and delta y. The delta x and delta y values are then plotted and a straight line is fitted through the plots. The slope of this line gives a measurement of the change in Z direction. By dividing by the time taken between handling two consecutive image frames (using 1/framerate as delta time) a measurement of the velocity in the Z direction is provided.
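One possible reading of this estimate can be sketched as below (the patent defers the details to SE 1250910-5, so this is a hedged interpretation, not the actual implementation): fit a least-squares line through the per-keypoint (delta x, delta y) displacements between two frames, take its slope as the measure of change in Z, and scale by the framerate to obtain a velocity measure. All names are illustrative.

```python
def z_velocity(dx, dy, framerate):
    """Least-squares slope of dy against dx, scaled to a per-second measure."""
    n = len(dx)
    mean_x = sum(dx) / n
    mean_y = sum(dy) / n
    # Standard least-squares slope: cov(dx, dy) / var(dx).
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(dx, dy))
    den = sum((x - mean_x) ** 2 for x in dx)
    slope = num / den
    # The delta time between consecutive frames is 1 / framerate, so
    # dividing the slope by it equals multiplying by the framerate.
    return slope * framerate
```

With keypoint displacements that all scale together, the fitted slope is stable against individual keypoint noise, which is presumably why a line fit is used rather than a single keypoint pair.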
The measurement may be chosen to represent a speed of 5 cm/s, 10 cm/s, 20 cm/s, or faster (or slower) to differentiate between a fast movement and a slow movement.
In one embodiment the laptop computer 100 is configured to detect the speed V of the tracked object H and determine whether the detected speed V is above a speed threshold and whether the movement is away from the display 120, the speed V being negative, and, if so, discontinue the tracking of the tracked object H and discontinue the indication of the current position on the display which is currently open for manipulation by the tracked object H.
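Taken together, the two speed-based embodiments above amount to a small decision rule: a fast movement towards the display is treated as a manipulation event, a fast movement away ends the tracking, and a slow movement merely adapts the marker area. The sketch below assumes a sign convention (positive speed towards the display) and action labels of my own choosing; the 10 cm/s threshold is one of the example values from the text.

```python
SPEED_THRESHOLD = 10.0  # cm/s; the text suggests 5, 10 or 20 cm/s

def classify_movement(speed_v):
    """speed_v > 0: towards the display; speed_v < 0: away from it."""
    if abs(speed_v) <= SPEED_THRESHOLD:
        return "adapt_marker_area"      # slow movement: adapt marker area
    if speed_v > 0:
        return "manipulation_event"     # e.g. a select or activate event
    return "discontinue_tracking"       # fast movement away from display
```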
The user may begin manipulation again by, for example, raising his hand H which is then detected by the laptop computer 100 which indicates the marker area 170, possibly at an initial position and at an initial size.
In one embodiment the laptop computer 100 is configured to determine if the marker area 170 coincides (at least partially) with a displayed object 135 and if the distance D1, D2 between the tracked object H and the displayed object 135 (or display 120) is below a second threshold, and if so display an option menu associated with the displayed object 135. As would be understood by a skilled person, the distance threshold depends on the computing device and the display size. As would be apparent to a skilled person, the exact distances and also the distance threshold are dependent to a large extent on features such as the display size, the camera viewing angle and the angle of the camera with regards to the display, and to provide distance thresholds suitable for all possible combinations would constitute an exhaustive work effort and not provide for a higher understanding of the manners taught herein. An example of a distance threshold is a distance of 10 cm.
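The option-menu condition can be sketched as a simple hit test: the menu is shown when the circular marker area overlaps a displayed object's bounding box and the hand is closer than the second threshold. The circle/box geometry and all names are assumptions for illustration; only the 10 cm figure comes from the text.

```python
SECOND_THRESHOLD_CM = 10.0  # example distance threshold from the text

def marker_overlaps(marker_x, marker_y, marker_radius, obj_box):
    """obj_box = (left, top, right, bottom) in display pixels."""
    left, top, right, bottom = obj_box
    # Distance from the marker centre to the closest point of the box.
    nearest_x = max(left, min(marker_x, right))
    nearest_y = max(top, min(marker_y, bottom))
    return ((marker_x - nearest_x) ** 2 +
            (marker_y - nearest_y) ** 2) <= marker_radius ** 2

def show_option_menu(marker, obj_box, distance_cm):
    """marker = (x, y, radius); True if the associated menu should open."""
    x, y, r = marker
    return marker_overlaps(x, y, r, obj_box) and distance_cm < SECOND_THRESHOLD_CM
```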
In one example, the displayed object 135 may relate to a media player application and the associated option menu may comprise controls for playing/pausing, skipping forwards/backwards and also possibly volume control, opening a (media) file etc.
In one embodiment the laptop computer 100 is configured to adapt an input interpretation scale based on the distance D1, D2 to the tracked object. The input interpretation scale determines how the tracked movement should correlate to the movement of the marker area 170. This allows for a user to control the accuracy of an input by moving his hand away from the display 120, thereby enabling larger movements of the tracked object to result in smaller movements of the marker area 170, which results in an increased accuracy as larger movements are easier to control and differentiate.
By configuring the laptop computer 100 to adapt the input interpretation scale non-linearly, either continuously or stepwise, the accuracy is further increased.
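The input interpretation scale can be sketched as below: the farther the hand is from the display, the more hand movement is needed per pixel of marker movement. The quadratic fall-off is one possible non-linear choice, assumed here for illustration; the reference distance and all names are likewise assumptions.

```python
def marker_delta(hand_delta_px, distance_cm, ref_cm=20.0):
    """Scale a tracked hand movement into a marker-area movement."""
    # At or inside the reference distance the mapping is 1:1; farther
    # away the marker moves less than the hand, non-linearly in distance.
    scale = (ref_cm / max(distance_cm, ref_cm)) ** 2
    return hand_delta_px * scale
```

Doubling the distance here quarters the marker movement, so the same physical hand sweep selects among finer-grained targets, which matches the accuracy argument in the text.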
Figure 5 shows a flowchart of a general method according to the teachings herein. A computing device detects and tracks 510 an object, such as a hand, and possibly assigns an initial position of a marker area which indicates an operating area.
The computing device changes the displaying properties 515 of the marker area and thereby visually indicates the operating area 520 to a user, who is able to discern the marker area as it differs from the surrounding displayed content in that, for example, the contrast and/or brightness is different in the marker area.
When the computing device detects a movement in a direction perpendicular to the display plane (that is, towards or away from the display), the displaying properties and/or the extension of the marker area are further changed 540. This allows for a user to more easily discern the marker area and therefore better control any manipulation to be made in the operating area.

The teachings herein provide the benefit that a user is provided with visual feedback that is easily discernible and which does not clutter, hide, obscure or conceal displayed content.
Another benefit lies in that a user is able to vary the feedback and possibly control region in a simple manner.
The invention has mainly been described above with reference to a few embodiments. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the invention, as defined by the appended patent claims.
Claims (19)
[1]
1. A computing device (100, 200) comprising a display (120) and a controller (210), wherein said controller (210) is configured to: detect and track an object (H) via a video stream (265) provided by a camera (160, 260); and indicate an operating area on the display (120) which is currently open for manipulation by the tracked object (H) by changing displaying properties of a marker area (170) on the display (120).
[2]
2. The computing device (100, 200) according to claim 1, wherein said displaying properties are the contrast and/or brightness of the marker area (170).
[3]
3. The computing device (100, 200) according to any preceding claim, wherein said controller (210) is further configured to indicate the operating area by only changing the displaying properties of the marker area (170).
[4]
4. The computing device (100, 200) according to any preceding claim, wherein the marker area (170) has an extension (d1, d2) and wherein said controller (210) is further configured to detect that the tracked object (H) is moved in a direction substantially perpendicular to the plane of the display (120) and in response thereto adapt the marker area (170), by further increasing the displaying properties of the marker area (170) and/or the extension (d1, d2) of the marker area (170).
[5]
5. The computing device (100, 200) according to claim 4, wherein the extension (d1, d2) of the marker area (170) substantially equals the operating area and any manipulation effected by a user results in a manipulation of any and all objects (135) within the marker area (170).
[6]
6. The computing device (100, 200) according to claim 4, wherein a center of the marker area (170) indicates the area which a user may manipulate and any manipulation effected by a user results in a manipulation of an object at or adjacent to the center of the marker area (170).
[7]
7. The computing device (100, 200) according to any of claims 4 to 6, wherein said controller (210) is further configured to adapt the marker area (170) increasingly as a distance (D1, D2) between the tracked object (H) and the display (120) is increased.
[8]
8. The computing device (100, 200) according to claim 7, wherein said controller (210) is further configured to detect a face in the vicinity of the tracked object (H) and determine if a distance to the face changes in the same manner as the distance (D1, D2) between the tracked object (H) and the display (120) and if so adapt the marker area (170) increasingly as the distance (D1, D2) between the tracked object (H) and the display (120) is increased.
[9]
9. The computing device (100, 200) according to any of claims 4 to 8, wherein said controller (210) is further configured to adapt the marker area (170) with respect to a distance (D1, D2) linearly.
[10]
10. The computing device (100, 200) according to any of claims 4 to 9, wherein said controller (210) is further configured to adapt the marker area (170) with respect to a distance (D1, D2) stepwise.
[11]
11. The computing device (100, 200) according to any preceding claim, wherein said controller (210) is further configured to indicate a marker area (170) at an initial position and/or an initial extension in response to detecting said tracked object (H).
[12]
12. The computing device (100, 200) according to any preceding claim, wherein said controller (210) is further configured to detect a speed (V) of the tracked object (H) and determine whether the detected speed (V) is above a speed threshold and, if so, determine that the tracked movement is an event relating to an object (135) to be manipulated, and, if the detected speed (V) is below the speed threshold, determine that the marker area (170) should be adapted.
[13]
13. The computing device (100, 200) according to any preceding claim, wherein said controller (210) is further configured to detect a speed (V) of the tracked object (H) when dependent on claims 1 to 11, and for all claim dependencies to determine whether the detected speed (V) is above a speed threshold and whether the movement is away from the display (120), and, if so, discontinue the tracking of the tracked object (H) and discontinue the indication of the operating area on the display (120).
[14]
14. The computing device (100, 200) according to any preceding claim, wherein said controller (210) is further configured to determine if the marker area (170) coincides, at least partially, with a displayed object (135) and if the distance (D1, D2) between the tracked object (H) and the display (120) is below a second threshold and, if so, display an option menu associated with the displayed object (135).
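Claim 14's two conditions — at least partial overlap between the marker area and a displayed object, and a distance below a second threshold — amount to a rectangle-intersection test combined with a comparison. A sketch with assumed coordinate conventions (rectangles as `(x, y, width, height)` tuples) and an assumed threshold value:

```python
def should_show_menu(marker, obj, distance_mm, second_threshold=150):
    """Return True if the option menu of claim 14 should be shown:
    the marker area at least partially overlaps the displayed object
    and the tracked object is closer to the display than the second
    threshold. Rectangle format and threshold are assumptions."""
    mx, my, mw, mh = marker
    ox, oy, ow, oh = obj
    # Standard axis-aligned rectangle overlap test.
    overlaps = (mx < ox + ow and ox < mx + mw and
                my < oy + oh and oy < my + mh)
    return overlaps and distance_mm < second_threshold
```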
[15]
15. The computing device (100, 200) according to any preceding claim, wherein said computing device (100, 200) is a mobile communications terminal.
[16]
16. The computing device (100, 200) according to any preceding claim, wherein said computing device (100, 200) is a computer.
[17]
17. The computing device (100, 200) according to any preceding claim, wherein said computing device (100, 200) is a media device.
[18]
18. A method for use in a computing device (100, 200) comprising a display (120), said method comprising: detecting and tracking an object (H) via a video stream (265) provided by a camera (160, 260); and indicating an operating area on the display (120) which is currently open for manipulation by the tracked object (H) by changing displaying properties of a marker area (170) on the display (120).
[19]
19. A computer readable storage medium (40) encoded with instructions (41) that, when loaded and executed on a controller of a computing device (100, 200), cause the method according to claim 18 to be performed.
Similar technologies:
Publication number | Publication date | Patent title
US9766722B2|2017-09-19|User terminal device and method for controlling the user terminal device thereof
EP2565768B1|2020-12-23|Mobile terminal and method of operating a user interface therein
US8495522B2|2013-07-23|Navigation in a display
US9201521B2|2015-12-01|Storing trace information
US20150363003A1|2015-12-17|Scalable input from tracked object
JP2016538601A|2016-12-08|System, device, and method for displaying picture-in-picture
US20160357428A1|2016-12-08|Display device, display controlling method, and computer program
US10180783B2|2019-01-15|Information processing device, information processing method and program that controls movement of a displayed icon based on sensor information and user input
US9355266B2|2016-05-31|Input by tracking gestures
US20130155108A1|2013-06-20|Augmented Reality User Interaction Methods, Computing Devices, And Articles Of Manufacture
US20150346947A1|2015-12-03|Feedback in touchless user interface
US9535493B2|2017-01-03|Apparatus, method, computer program and user interface
US9535604B2|2017-01-03|Display device, method for controlling display, and recording medium
EP3204947A1|2017-08-16|Selecting frame from video on user interface
US20120287063A1|2012-11-15|System and method for selecting objects of electronic device
US20180329612A1|2018-11-15|Interfacing with a computing device
US20150363004A1|2015-12-17|Improved tracking of an object for controlling a touchless user interface
KR20160096645A|2016-08-16|Binding of an apparatus to a computing device
JP2014154908A|2014-08-25|Moving image reproducing apparatus and program
US20160124602A1|2016-05-05|Electronic device and mouse simulation method
JP6484859B2|2019-03-20|Information processing apparatus, information processing method, and program
KR20170125788A|2017-11-15|Methods and systems for positioning and controlling sound images in three-dimensional space
US20140184566A1|2014-07-03|Electronic apparatus, method of controlling the same, and computer-readable recording medium
WO2016035621A1|2016-03-10|Information processing device, information processing method, and program
JP2016167171A|2016-09-15|Electronic apparatus
Patent family:
Publication number | Publication date
SE536989C2|2014-11-25|
CN104937522A|2015-09-23|
US20150346947A1|2015-12-03|
EP2948831A4|2016-12-28|
EP2948831A1|2015-12-02|
WO2014116168A1|2014-07-31|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title

WO2006054207A1|2004-11-16|2006-05-26|Koninklijke Philips Electronics N.V.|Touchless manipulation of images for regional enhancement|
CN101405177A|2006-03-22|2009-04-08|大众汽车有限公司|Interactive operating device and method for operating the interactive operating device|
US8726194B2|2007-07-27|2014-05-13|Qualcomm Incorporated|Item selection using enhanced control|
US8432365B2|2007-08-30|2013-04-30|Lg Electronics Inc.|Apparatus and method for providing feedback for three-dimensional touchscreen|
US20090172606A1|2007-12-31|2009-07-02|Motorola, Inc.|Method and apparatus for two-handed computer user interface with gesture recognition|
US8516397B2|2008-10-27|2013-08-20|Verizon Patent And Licensing Inc.|Proximity interface apparatuses, systems, and methods|
DE102009006082A1|2009-01-26|2010-07-29|Alexander Gruber|Method for controlling selection object displayed on monitor of personal computer, involves changing presentation of object on display based on position of input object normal to plane formed by pressure-sensitive touchpad or LED field|
KR20100113704A|2009-04-14|2010-10-22|삼성전자주식회사|Method and apparatus for selecting an item|
JP5343773B2|2009-09-04|2013-11-13|ソニー株式会社|Information processing apparatus, display control method, and display control program|
US8418237B2|2009-10-20|2013-04-09|Microsoft Corporation|Resource access based on multiple credentials|
EP2395413B1|2010-06-09|2018-10-03|The Boeing Company|Gesture-based human machine interface|
JP5569271B2|2010-09-07|2014-08-13|ソニー株式会社|Information processing apparatus, information processing method, and program|
US8872762B2|2010-12-08|2014-10-28|Primesense Ltd.|Three dimensional user interface cursor control|
US8933876B2|2010-12-13|2015-01-13|Apple Inc.|Three dimensional user interface session control|
KR101896947B1|2011-02-23|2018-10-31|엘지이노텍 주식회사|An apparatus and method for inputting command using gesture|
GB2488785A|2011-03-07|2012-09-12|Sharp Kk|A method of user interaction with a device in which a cursor position is calculated using information from tracking part of the user and an object|
KR20120119440A|2011-04-21|2012-10-31|삼성전자주식회사|Method for recognizing user's gesture in a electronic device|
JP2012248066A|2011-05-30|2012-12-13|Canon Inc|Image processing device, control method of the same, control program and imaging apparatus|
JP6074170B2|2011-06-23|2017-02-01|インテル・コーポレーション|Short range motion tracking system and method|
EP2541383B1|2011-06-29|2021-09-22|Sony Group Corporation|Communication device and method|
WO2015022498A1|2013-08-15|2015-02-19|Elliptic Laboratories As|Touchless user interfaces|
US9501810B2|2014-09-12|2016-11-22|General Electric Company|Creating a virtual environment for touchless interaction|
DE102015012720A1|2015-10-01|2017-04-06|Audi Ag|Interactive operator system and method for performing an operator action in an interactive operator system|
CA2957105A1|2016-02-03|2017-08-03|Op-Hygiene Ip Gmbh|Interactive display device|
Legal status:
Priority:
Application number | Filing date | Patent title
SE1350065A|SE536989C2|2013-01-22|2013-01-22|Improved feedback in a seamless user interface|
SE1350065A| SE536989C2|2013-01-22|2013-01-22|Improved feedback in a seamless user interface|
US14/761,825| US20150346947A1|2013-01-22|2014-01-22|Feedback in touchless user interface|
EP14742912.0A| EP2948831A4|2013-01-22|2014-01-22|Improved feedback in touchless user interface|
PCT/SE2014/050071| WO2014116168A1|2013-01-22|2014-01-22|Improved feedback in touchless user interface|
CN201480005377.4A| CN104937522A|2013-01-22|2014-01-22|Improved feedback in touchless user interface|